
    A Bayesian method for calculating real-time quantitative PCR calibration curves using absolute plasmid DNA standards

    Abstract

    Background: In real-time quantitative PCR studies using absolute plasmid DNA standards, a calibration curve is developed to estimate an unknown DNA concentration. However, potential differences in the amplification performance of plasmid DNA compared to genomic DNA standards are often ignored in calibration calculations, and in some cases are impossible to characterize. A flexible statistical method is needed that can account for uncertainty between plasmid and genomic DNA targets, replicate testing, and experiment-to-experiment variability when estimating calibration curve parameters such as intercept and slope. Here we report the use of a Bayesian approach to generate calibration curves for the enumeration of target DNA from genomic DNA samples using absolute plasmid DNA standards.

    Results: Instead of the two traditional methods (classical and inverse), Markov chain Monte Carlo (MCMC) estimation was used to generate single, master, and modified calibration curves. The mean and percentiles of the posterior distribution were used as point and interval estimates of unknown parameters such as intercepts, slopes and DNA concentrations. The software WinBUGS was used to perform all simulations and to generate the posterior distributions of all the unknown parameters of interest.

    Conclusion: The Bayesian approach described in this study allowed the estimation of DNA concentrations in environmental samples from absolute standard curves generated by real-time qPCR. The approach accounted for uncertainty from multiple sources, such as experiment-to-experiment variation, variability between replicate measurements, and the uncertainty introduced when employing calibration curves generated from absolute plasmid DNA standards.
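    The calibration idea above can be sketched with a toy sampler. The following is a minimal illustration, not the paper's WinBUGS model: it simulates triplicate standard-curve Ct readings from a straight line in log10 concentration, then runs a random-walk Metropolis sampler (one simple MCMC variant) to obtain posterior means and 95% intervals for the intercept and slope. All numbers (true intercept 38, slope -3.3, noise SD 0.2) are made up for illustration.

```python
import math, random

random.seed(1)

# Synthetic standard-curve data: Ct = a + b * log10(conc) + noise
# (hypothetical values; a slope near -3.3 corresponds to ~100% efficiency)
true_a, true_b, sigma = 38.0, -3.3, 0.2
log10_conc = [1, 2, 3, 4, 5, 6] * 3          # triplicate standards
ct = [true_a + true_b * x + random.gauss(0, sigma) for x in log10_conc]

def log_lik(a, b):
    """Gaussian log-likelihood of the Ct readings given the line (a, b)."""
    return sum(-0.5 * ((y - (a + b * x)) / sigma) ** 2
               for x, y in zip(log10_conc, ct))

# Random-walk Metropolis over (a, b) with flat priors
a, b = 35.0, -3.0
ll = log_lik(a, b)
samples = []
for i in range(20000):
    a_new = a + random.gauss(0, 0.05)
    b_new = b + random.gauss(0, 0.02)
    ll_new = log_lik(a_new, b_new)
    if math.log(random.random()) < ll_new - ll:   # Metropolis acceptance
        a, b, ll = a_new, b_new, ll_new
    if i >= 5000:                                  # discard burn-in
        samples.append((a, b))

# Posterior mean and percentile interval, as in the abstract
a_post = sorted(s[0] for s in samples)
b_post = sorted(s[1] for s in samples)
n = len(samples)
print("intercept: mean %.2f, 95%% CI (%.2f, %.2f)"
      % (sum(a_post) / n, a_post[int(0.025 * n)], a_post[int(0.975 * n)]))
print("slope:     mean %.2f, 95%% CI (%.2f, %.2f)"
      % (sum(b_post) / n, b_post[int(0.025 * n)], b_post[int(0.975 * n)]))
```

    An unknown sample's concentration would then be read off each posterior draw of the line, propagating the calibration uncertainty into the estimate.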

    Improving Phase Change Memory Performance with Data Content Aware Access

    A prominent characteristic of write operation in Phase-Change Memory (PCM) is that its latency and energy are sensitive to the data to be written as well as the content that is overwritten. We observe that overwriting unknown memory content can incur significantly higher latency and energy compared to overwriting known all-zeros or all-ones content. This is because all-zeros or all-ones content is overwritten by programming the PCM cells only in one direction, i.e., using either SET or RESET operations, not both. In this paper, we propose data content aware PCM writes (DATACON), a new mechanism that reduces the latency and energy of PCM writes by redirecting these requests to overwrite memory locations containing all-zeros or all-ones. DATACON operates in three steps. First, it estimates how much a PCM write access would benefit from overwriting known content (e.g., all-zeros, or all-ones) by comprehensively considering the number of set bits in the data to be written, and the energy-latency trade-offs for SET and RESET operations in PCM. Second, it translates the write address to a physical address within memory that contains the best type of content to overwrite, and records this translation in a table for future accesses. We exploit data access locality in workloads to minimize the address translation overhead. Third, it re-initializes unused memory locations with known all-zeros or all-ones content in a manner that does not interfere with regular read and write accesses. DATACON overwrites unknown content only when it is absolutely necessary to do so. We evaluate DATACON with workloads from state-of-the-art machine learning applications, SPEC CPU2017, and NAS Parallel Benchmarks. 
    Results demonstrate that DATACON significantly improves system performance and reduces memory system energy consumption compared to the best of the performance-oriented state-of-the-art techniques.

    Comment: 18 pages, 21 figures, accepted at the ACM SIGPLAN International Symposium on Memory Management (ISMM)
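    The first DATACON step, choosing which known background is cheaper to overwrite, can be sketched with a toy cost model. The per-bit latencies below are illustrative placeholders, not the paper's measured PCM parameters, and real devices program many cells in parallel rather than bit-serially; the sketch only shows the set-bit-counting logic.

```python
# Hypothetical per-bit programming costs (ns). In PCM, SET is slower than
# RESET; the exact ratio here is made up for illustration.
SET_LATENCY, RESET_LATENCY = 150, 50

def popcount(word, width=64):
    """Number of 1 bits in the data word to be written."""
    return bin(word & ((1 << width) - 1)).count("1")

def best_background(word, width=64):
    """DATACON-style selection: decide whether overwriting an all-zeros
    or an all-ones location is cheaper for this data word.

    Over all-zeros content only the 1 bits need programming (SET only);
    over all-ones content only the 0 bits need programming (RESET only).
    Returns (background, estimated_cost)."""
    ones = popcount(word, width)
    zeros = width - ones
    cost_over_zeros = ones * SET_LATENCY     # program the 1s with SET
    cost_over_ones = zeros * RESET_LATENCY   # program the 0s with RESET
    if cost_over_zeros <= cost_over_ones:
        return "all-zeros", cost_over_zeros
    return "all-ones", cost_over_ones

# Mostly-zero data favours an all-zeros background; mostly-one data
# favours an all-ones background.
print(best_background(0x0000000000000F01))   # → ('all-zeros', 750)
print(best_background(0xFFFFFFFFFFFF0FF0))   # → ('all-ones', 400)
```

    The second step would then map the write to a pre-initialized location of the chosen type and record the translation for later reads.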

    Evolutionary Game Theory and Social Learning Can Determine How Vaccine Scares Unfold

    Immunization programs have often been impeded by vaccine scares, as evidenced by the measles-mumps-rubella (MMR) autism vaccine scare in Britain. A “free rider” effect may be partly responsible: vaccine-generated herd immunity can reduce disease incidence to such low levels that real or imagined vaccine risks appear large in comparison, causing individuals to cease vaccinating. This implies a feedback loop between disease prevalence and strategic individual vaccinating behavior. Here, we analyze a model based on evolutionary game theory that captures this feedback in the context of vaccine scares, and that also includes social learning. Vaccine risk perception evolves over time according to an exogenously imposed curve. We test the model against vaccine coverage data and disease incidence data from two vaccine scares in England & Wales: the whole-cell pertussis vaccine scare and the MMR vaccine scare. The model fits vaccine coverage data from both vaccine scares relatively well. Moreover, the model can explain the vaccine coverage data more parsimoniously than most competing models without social learning and/or feedback (hence, adding social learning and feedback to a vaccine scare model improves model fit with little or no parsimony penalty). Under some circumstances, the model can predict future vaccine coverage and disease incidence (up to 10 years in advance in the case of pertussis), including specific qualitative features of the dynamics, such as future incidence peaks and undulations in vaccine coverage due to the population's response to changing disease incidence. Vaccine scares could become more common as eradication goals are approached for more vaccine-preventable diseases. Such models could help us predict how vaccine scares might unfold and assist mitigation efforts.
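    The prevalence-behavior feedback with social learning described above can be sketched as a toy discrete-time imitation model. All parameters below are illustrative stand-ins, not the values fitted to the pertussis or MMR data, and the epidemiology is reduced to a single relaxation equation.

```python
# Toy sketch of the coupled prevalence-behaviour feedback with social
# learning (imitation dynamics). Every constant here is made up.
def simulate(years, scare_start, scare_end,
             base_risk=0.01, scare_risk=0.20,
             imitation=3.0, sensitivity=5.0):
    x = 0.95           # vaccine coverage (fraction of vaccinators)
    prev = 0.01        # disease prevalence
    coverage = []
    for t in range(years):
        # perceived vaccine risk follows an exogenously imposed curve:
        # elevated during the scare years, low otherwise
        risk = scare_risk if scare_start <= t < scare_end else base_risk
        # herd immunity: prevalence relaxes toward a level set by the
        # unvaccinated pool
        prev += 0.5 * (0.2 * (1.0 - x) - prev)
        # social learning: individuals imitate the strategy with the
        # higher payoff; free riding pays off when prevalence is low
        payoff_gain = sensitivity * prev - risk
        x += imitation * x * (1.0 - x) * payoff_gain
        x = min(max(x, 0.01), 0.99)
        coverage.append(x)
    return coverage

cov = simulate(40, scare_start=10, scare_end=20)
print("before scare: %.2f  trough: %.2f  after: %.2f"
      % (cov[9], min(cov[10:20]), cov[-1]))
```

    Even this caricature reproduces the qualitative shape the paper fits: coverage dips during the scare, prevalence rebounds, and the rising infection risk eventually pulls coverage back up.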

    A new real-time PCR method to overcome significant quantitative inaccuracy due to slight amplification inhibition

    Abstract

    Background: Real-time PCR analysis is a sensitive DNA quantification technique that has recently gained considerable attention in biotechnology, microbiology and molecular diagnostics. Although the cycle-threshold (Ct) method is the present "gold standard", it is far from being a standard assay. Uniform reaction efficiency among samples is the most important assumption of this method. Nevertheless, some authors have reported that it may not be correct, and a slight PCR efficiency decrease of about 4% can result in an error of up to 400% using the Ct method. This decrease in reaction efficiency may be caused by inhibiting agents used during nucleic acid extraction or co-purified from the biological sample. We propose a new method (Cy0) that does not require the assumption of equal reaction efficiency between unknowns and the standard curve.

    Results: The Cy0 method is based on fitting Richards' equation to real-time PCR data by nonlinear regression to obtain best-fit estimates of the reaction parameters. These parameters are then used to calculate the Cy0 value, which minimizes the dependence of the result on PCR kinetics. The Ct, second derivative (Cp), sigmoidal curve fitting (SCF) and Cy0 methods were compared using two criteria: precision and accuracy. Our results demonstrate that, under optimal amplification conditions, these four methods are equally precise and accurate. However, when PCR efficiency was slightly decreased, by diluting the amplification mix or adding a biological inhibitor such as IgG, the SCF, Ct and Cp methods were markedly impaired, while the Cy0 method gave significantly more accurate and precise results.

    Conclusion: Our results demonstrate that Cy0 represents a significant improvement over the standard methods, providing reliable and precise nucleic acid quantification even under sub-optimal amplification conditions and overcoming the underestimation caused by the presence of some PCR inhibitors.
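    Given a fitted Richards curve, the Cy0 value (in this method, the point where the tangent at the curve's inflection crosses the cycle axis) has a closed form. A small sketch with illustrative parameters, not fitted ones; the nonlinear-regression step itself is omitted:

```python
import math

def richards(x, fmax, b, e, d):
    """Richards curve for fluorescence vs. cycle (baseline omitted)."""
    return fmax / (1.0 + math.exp(-(x - b) / e)) ** d

def cy0(fmax, b, e, d):
    """Cy0 = crossing of the cycle axis by the tangent drawn at the
    inflection point of the fitted Richards curve."""
    x_inf = b + e * math.log(d)                    # inflection point
    y = richards(x_inf, fmax, b, e, d)             # fluorescence there
    slope = (fmax / e) * (1 + 1 / d) ** (-d - 1)   # slope at inflection
    return x_inf - y / slope                       # tangent hits zero here

# Illustrative parameters (not fitted to real amplification data)
print(round(cy0(100.0, 20.0, 1.5, 0.8), 3))
```

    Because the tangent is taken where amplification is fastest, Cy0 moves far less than Ct when the plateau or the efficiency of the reaction changes, which is the robustness the abstract reports.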

    Circulating tumour DNA is a promising biomarker for risk stratification of central chondrosarcoma with IDH1/2 and GNAS mutations.

    Chondrosarcoma (CS) is a rare tumour type and the most common primary malignant bone cancer in adults. The prognosis, currently based on tumour grade, imaging and anatomical location, is not reliable, and more objective biomarkers are required. We aimed to determine whether the level of circulating tumour DNA (ctDNA) in the blood of CS patients could be used to predict outcome. In this multi-institutional study, we recruited 145 patients with cartilaginous tumours, of whom 41 were excluded. ctDNA levels were assessed in 83 of the remaining 104 patients, whose tumours harboured a hotspot mutation in IDH1/2 or GNAS. ctDNA was detected pre-operatively in 31/83 (37%) patients and post-operatively in 12/31 (39%). We found that detection of ctDNA was more accurate than pathology for identification of high-grade tumours and was associated with a poor prognosis; ctDNA was never detected in association with CS grade 1/atypical cartilaginous tumours (ACT), with neoplasms sited in the small bones of the hands and feet, or with tumours measuring less than 80 mm. Although the results are promising, they are based on a small number of patients; therefore, introduction of this blood test into clinical practice as a complementary assay to current standard-of-care protocols would allow the assay to be assessed more stringently and developed for a more personalised approach to the treatment of patients with CS.

    Aging-Aware Request Scheduling for Non-Volatile Main Memory

    Modern computing systems are embracing non-volatile memory (NVM) to implement high-capacity and low-cost main memory. Elevated operating voltages of NVM accelerate the aging of CMOS transistors in the peripheral circuitry of each memory bank. Aggressive device scaling increases power density and temperature, which further accelerate aging, challenging the reliable operation of NVM-based main memory. We propose HEBE, an architectural technique to mitigate the circuit-aging problems of NVM-based main memory. HEBE is built on three contributions. First, we propose a new analytical model that dynamically tracks the aging in the peripheral circuitry of each memory bank based on the bank's utilization. Second, we develop an intelligent memory request scheduler that exploits this aging model at run time to de-stress the peripheral circuitry of a memory bank only when its aging exceeds a critical threshold. Third, we introduce an isolation transistor to decouple parts of a peripheral circuit operating at different voltages, allowing the decoupled logic blocks to undergo long-latency de-stress operations independently and off the critical path of memory read and write accesses, improving performance. We evaluate HEBE with workloads from the SPEC CPU2017 benchmark suite. Our results show that HEBE significantly improves both performance and lifetime of NVM-based main memory.

    Comment: To appear in ASP-DAC 202
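    The scheduler's decision rule (the second contribution) can be sketched as a simple threshold policy. The counter below is a crude stand-in for HEBE's analytical transistor-aging model, and every constant is made up for illustration.

```python
# Illustrative constants; HEBE derives these from an analytical model of
# transistor aging, not fixed numbers like these.
AGING_THRESHOLD = 100.0
STRESS_PER_ACCESS = 0.5       # each high-voltage access ages the periphery
RECOVERY_PER_DESTRESS = 80.0  # one de-stress operation relieves this much

class Bank:
    """Per-bank aging state, updated from the bank's utilization."""
    def __init__(self):
        self.aging = 0.0
        self.destress_ops = 0

    def access(self):
        self.aging += STRESS_PER_ACCESS

    def maybe_destress(self):
        """De-stress the peripheral circuitry only when aging exceeds the
        critical threshold, keeping the cost off the common-case path."""
        if self.aging > AGING_THRESHOLD:
            self.aging = max(0.0, self.aging - RECOVERY_PER_DESTRESS)
            self.destress_ops += 1
            return True
        return False

bank = Bank()
for _ in range(500):          # 500 memory requests to one bank
    bank.access()
    bank.maybe_destress()
print(bank.destress_ops, round(bank.aging, 1))   # → 2 90.0
```

    A heavily utilized bank triggers de-stress operations often; an idle bank never pays for them, which is the point of making the decision utilization-driven.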

    The spine in Paget’s disease

    Paget’s disease (PD) is a chronic metabolically active bone disease, characterized by a disturbance in bone modelling and remodelling due to an increase in osteoblastic and osteoclastic activity. The vertebra is the second most commonly affected site. This article reviews the various spinal pathomechanisms and osseous dynamics involved in producing the varied imaging appearances and their clinical relevance. Advanced imaging of osseous, articular and bone marrow manifestations of PD in all the vertebral components is presented. Pagetic changes often result in clinical symptoms including back pain, spinal stenosis and neural dysfunction. Various pathological complications due to PD involvement result in these clinical symptoms. Recognition of the imaging manifestations of spinal PD and the potential complications that cause the clinical symptoms enables accurate assessment of patients prior to appropriate management.

    A new MRI rating scale for progressive supranuclear palsy and multiple system atrophy: validity and reliability

    AIM: To evaluate a standardised MRI acquisition protocol and a new image rating scale for disease severity in patients with progressive supranuclear palsy (PSP) and multiple system atrophy (MSA) in a large multicentre study.

    METHODS: The MRI protocol consisted of two-dimensional sagittal and axial T1, axial PD, and axial and coronal T2 weighted acquisitions. The 32-item ordinal scale evaluated abnormalities within the basal ganglia and posterior fossa, with raters blind to diagnosis. Among 760 patients in the study population (PSP = 362, MSA = 398), 627 had per-protocol images (PSP = 297, MSA = 330). Intra-rater (n = 60) and inter-rater (n = 555) reliability were assessed through Cohen's κ statistic, and scale structure through principal component analysis (PCA) (n = 441). Internal consistency and reliability were checked. Discriminant and predictive validity of the extracted factors and total scores were tested against disease severity and clinical diagnosis.

    RESULTS: Intra-rater and inter-rater reliability were acceptable (κ ≥ 0.41) for 25 (78%) of the items scored. PCA revealed four meaningful clusters of covarying parameters (factor (F) F1: brainstem and cerebellum; F2: midbrain; F3: putamen; F4: other basal ganglia) with good to excellent internal consistency (Cronbach's α 0.75-0.93) and moderate to excellent reliability (intraclass coefficient: F1: 0.92; F2: 0.79; F3: 0.71; F4: 0.49). The total score significantly discriminated for disease severity or diagnosis; factorial scores differentially discriminated for disease severity according to diagnosis (PSP: F1-F2; MSA: F2-F3). The total score was significantly related to survival in PSP (p < 0.0007) and MSA (p < 0.0005), indicating good predictive validity.

    CONCLUSIONS: The scale is suitable for use in the context of multicentre studies and can reliably and consistently measure MRI abnormalities in PSP and MSA.

    Clinical Trial Registration Number: The study protocol was filed in the open clinical trial registry (http://www.clinicaltrials.gov) with ID No NCT00211224.
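    Cronbach's α, the internal-consistency statistic quoted for each factor of the scale, is simple to compute from item-level scores. A sketch with hypothetical ordinal ratings (not the study's data):

```python
def variance(xs):
    """Population variance of a list of scores."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(items):
    """Cronbach's alpha: items[i][j] is the score on item i for subject j.

    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = len(items)
    totals = [sum(col) for col in zip(*items)]    # per-subject total score
    item_var = sum(variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Four hypothetical ordinal items scored on five subjects
items = [
    [0, 1, 2, 2, 3],
    [0, 1, 1, 2, 3],
    [1, 1, 2, 3, 3],
    [0, 2, 2, 2, 3],
]
print(round(cronbach_alpha(items), 3))   # → 0.961
```

    Items that rise and fall together across subjects push α toward 1; the 0.75-0.93 range reported per factor indicates the items within each factor behave consistently.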